A total of 338 results match the query.
1.
The reproducibility crisis, that is, the fact that many scientific results are difficult to replicate, pointing to their unreliability or falsehood, is a hot topic in the recent scientific literature, and statistical methodologies, testing procedures and p-values in particular, are at the centre of the debate. Assessing the extent of the problem, namely the reproducibility rate or the false discovery rate, and the role of contributing factors is still an open problem. Replication experiments, that is, systematic replications of existing results, may offer relevant information on these issues. We propose a statistical model for such information, in particular to estimate the reproducibility rate and the effect of some study characteristics on its reliability. We analyse data from a recent replication experiment in psychology and find a reproducibility rate broadly coherent with other assessments from the same experiment. Our results also confirm the expected role of some contributing factors (unexpectedness of the result and room for bias), while they suggest that the similarity between the original study and the replication is not so relevant, thus mitigating some criticism directed at replication experiments.
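A minimal sketch of the kind of calculation involved, not the authors' model: with binary replication outcomes, the crude reproducibility rate is just the success proportion, and a logistic regression relates it to a study characteristic such as room for bias. All data and variable names below are hypothetical.

```python
# Minimal sketch (not the authors' model): crude reproducibility rate plus a
# logistic regression of replication success on a hypothetical study feature.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical data: y = 1 if the result replicated, x = "room for bias" score.
x = rng.normal(size=100)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(0.2 - 0.8 * x))))

def neg_loglik(beta):
    eta = beta[0] + beta[1] * x
    # log p = -log(1 + e^-eta), log(1 - p) = -log(1 + e^eta)  (numerically stable)
    return np.sum(y * np.logaddexp(0.0, -eta) + (1 - y) * np.logaddexp(0.0, eta))

fit = minimize(neg_loglik, x0=np.zeros(2))
print("crude reproducibility rate:", y.mean())
print("logistic coefficients (intercept, slope):", fit.x)
```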
2.
3.
Motivated by the implied stochastic volatility literature (Britten-Jones and Neuberger, forthcoming; Derman and Kani, 1997; Ledoit and Santa-Clara, 1998), this paper proposes a new and general method for constructing smile-consistent stochastic volatility models. The method is developed by recognising that option pricing and hedging can be accomplished via the simulation of the implied risk neutral distribution. We devise an algorithm for the simulation of the implied distribution when the first two moments change over time. The algorithm can be implemented easily, and it is based on an economic interpretation of the concept of a mixture of distributions. It can also be generalised to cases where more complicated forms of the mixture are assumed.
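As a rough illustration of pricing off a mixture of distributions (a sketch, not the paper's simulation algorithm), the snippet below prices a European call by drawing the terminal price from a two-component lognormal mixture; the weights and volatilities are hypothetical, whereas in a smile-consistent application they would be backed out from option data.

```python
# Sketch: Monte Carlo call price under a two-component lognormal mixture used
# as a stand-in for an implied risk-neutral distribution (hypothetical inputs).
import numpy as np

rng = np.random.default_rng(1)
S0, K, r, T = 100.0, 100.0, 0.02, 0.5
w, sigma_lo, sigma_hi = 0.7, 0.15, 0.35   # mixture weight and component vols

n = 200_000
sigma = np.where(rng.random(n) < w, sigma_lo, sigma_hi)
z = rng.standard_normal(n)
# Each component carries the risk-free drift, so the mixture is risk neutral.
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)

call = np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()
print(f"mixture Monte Carlo call price: {call:.4f}")
```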
4.
Our paper provides a brief review and summary of issues and advances in the use of latent structure and other finite mixture models in the analysis of choice data. Focus is directed to three primary areas: (1) estimation and computational issues, (2) specification and interpretation issues, and (3) future research issues. We comment on what latent structure models have promised, what has been delivered to date, and what we should look forward to in the future.
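For readers unfamiliar with the mechanics, the snippet below is a minimal sketch (not any specific model from the review) of the finite-mixture idea behind latent class choice analysis: two latent segments with their own multinomial choice probabilities, recovered by EM from hypothetical data.

```python
# Sketch: EM for a two-class mixture of multinomial choice probabilities
# (hypothetical data; segment labels are unobserved).
import numpy as np

rng = np.random.default_rng(2)
J, n = 3, 500
true_p = np.array([[0.7, 0.2, 0.1], [0.1, 0.3, 0.6]])
z = rng.random(n) < 0.4                       # latent segment membership
choices = np.array([rng.choice(J, p=true_p[int(k)]) for k in z])

pi = np.array([0.5, 0.5])                     # starting class shares
p = rng.dirichlet(np.ones(J), size=2)         # starting choice probabilities
for _ in range(200):
    like = pi[:, None] * p[:, choices]              # (2, n) component likelihoods
    resp = like / like.sum(axis=0, keepdims=True)   # E-step: responsibilities
    pi = resp.mean(axis=1)                          # M-step: class shares
    for k in range(2):
        counts = np.bincount(choices, weights=resp[k], minlength=J)
        p[k] = counts / counts.sum()                # M-step: choice probabilities

print("estimated class shares:", np.round(pi, 3))
print("estimated choice probabilities:\n", np.round(p, 3))
```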
5.
Wu Jong-Wuu, Lee Wen-Chuan, Tsai Hui-Yin 《Quality and Quantity》2002,36(3):311-323
In recent papers, Moon and Choi (1998) and Hariga and Ben-Daya (1999) considered a continuous review inventory model with a mixture of backorders and lost sales in which the lead time, the order quantity, and the reorder point are decision variables. They also developed a minimax distribution free procedure for the problem. When the demands of different customers during the lead time are not identical, however, a single distribution (as in Moon and Choi (1998) and Hariga and Ben-Daya (1999)) cannot describe the lead time demand. Hence, we correct and extend the model of Moon and Choi (1998) and Hariga and Ben-Daya (1999) by modelling the lead time demand as a mixture of distributions. In addition, we apply the minimax mixture-of-distributions free approach to the model, simultaneously optimizing the order quantity, the reorder point, and the lead time to devise a practical procedure which can be used without specific information on the demand distribution.
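A minimal sketch of the distribution-free ingredient this line of work builds on (not the paper's full procedure): when lead time demand has mean mu*L and variance sigma^2*L but unknown form, the worst-case expected shortage at reorder point r admits a closed-form bound; under a mixture of distributions, the same bound would be applied component by component. Parameter values below are hypothetical.

```python
# Sketch: worst-case (minimax distribution-free) expected shortage bound
#   E[(X - r)^+] <= ( sqrt(sigma^2*L + (r - mu*L)^2) - (r - mu*L) ) / 2
import math

def worst_case_expected_shortage(mu: float, sigma: float, L: float, r: float) -> float:
    m, s2 = mu * L, sigma**2 * L
    return 0.5 * (math.sqrt(s2 + (r - m) ** 2) - (r - m))

# Hypothetical inputs: demand 20/week, std 5 per sqrt(week), 3-week lead time.
print(worst_case_expected_shortage(mu=20.0, sigma=5.0, L=3.0, r=70.0))
```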
6.
蒋玲 《技术经济》2007,26(11):107-109,120
After the Second World War, economists reflecting on the traditional economy began to use the underground economy to explain deviations in macroeconomic forecasts and the upward and outward spiralling of the Phillips curve observed in real economies. The underground economy is generally taken to consist of the economic activities, and the income derived from them, by which firms or individuals avoid and evade government regulation, taxation, and supervision. Building on the relationship between the government, as policymaker and observer, and the surveyed parties, namely firms and individuals, this paper provides a template for an interacting system: using the signalling and feedback generated between the government and firms or individuals in the course of tax payment, it applies a dynamic game analysis with incomplete information and shows the important role that complete and transparent information systems, strict government laws and policies, and the government's discretionary mechanisms play in socioeconomic development.
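As a stylized illustration only (far simpler than the paper's dynamic game of incomplete information), the comparison below shows the basic incentive at stake: a firm hides income only when the expected penalty falls short of the tax saved, so a better information system, by raising the detection probability, deters evasion. All numbers are hypothetical.

```python
# Stylized expected-payoff check: evasion pays only if
# detection_prob * penalty_rate * tax_due < tax_due.
def evasion_pays(tax_due: float, detection_prob: float, penalty_rate: float) -> bool:
    return detection_prob * penalty_rate * tax_due < tax_due

for p in (0.2, 0.6):   # detection improves as information becomes more complete
    print(f"detection prob {p:.1f}: evasion pays -> {evasion_pays(100.0, p, 2.0)}")
```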
7.
This study analyzes sovereign risk contagion between four East Asian economies (China, Hong Kong, Japan, and Korea) and its structural changes through the Global Financial Crisis (GFC) and the European Debt Crisis (EDC) by applying a mixture of time-varying copulas to those economies' credit default swap (CDS) spreads.

This article first finds strong contagion from the US and PIIGS economies to the East Asian sovereign CDS markets, as well as intraregional contagion within the East Asian markets. Second, the impact of contagion differs depending on whether it is measured by linear (Gaussian) or upper tail dependence. Third, Japan plays an important role in increasing the linear dependence, whereas China and Korea are crucial in terms of the upper tail dependence. Lastly, the GFC has structurally increased the linear dependence, but not the upper tail dependence, between the East Asian sovereign CDS markets.
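A small numerical sketch of why the two measures can disagree (not the paper's time-varying specification): a Gumbel copula and a Gaussian copula calibrated to the same Kendall's tau imply very different upper tail dependence, since the Gaussian copula has none. The parameter value below is assumed for illustration.

```python
# Sketch: same overall (rank) dependence, different upper tail dependence.
import math

theta = 2.0                                   # assumed Gumbel copula parameter
tau = 1.0 - 1.0 / theta                       # Kendall's tau of Gumbel(theta)
rho = math.sin(math.pi * tau / 2.0)           # Gaussian copula matching that tau

lambda_u_gumbel = 2.0 - 2.0 ** (1.0 / theta)  # Gumbel upper tail dependence
lambda_u_gauss = 0.0                          # Gaussian copula: zero for |rho| < 1

print(f"Kendall's tau            : {tau:.3f}")
print(f"matched Gaussian rho     : {rho:.3f}")
print(f"upper tail dep. (Gumbel) : {lambda_u_gumbel:.3f}")
print(f"upper tail dep. (Gauss)  : {lambda_u_gauss:.3f}")
```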

8.
Credit derivatives pricing models before Basel III ignored losses in market value stemming from a higher probability of counterparty default. We propose a general credit derivatives pricing model to evaluate a Credit Default Swap (CDS) with counterparty risk, including the Credit Valuation Adjustment (CVA), in order to optimize the economic capital allocation. We work from the model proposed by Luciano (2003, Working Paper, International Center of Economic Research) and the general pricing representation established by Sorensen and Bollier (Financial Analysts Journal 1994;50(3):23-33) to provide a model that is close to market practice, easy to implement, and consistent with the Basel III framework. We capture the dependence between counterparty risk and that of the reference entity with a copula, in particular a mixture copula that combines common "extreme" copulas, and study the CDS's vulnerability under extreme dependence. By varying Spearman's rho, the mixture copula covers a broad spectrum of dependence and ensures closed-form prices. We conclude with an application to real market data.
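To make the "mixture of extreme copulas" idea concrete, the sketch below implements a Fréchet-type mixture of the countermonotonic, independence, and comonotonic copulas; because Spearman's rho is linear in the copula, it reduces to q - p in closed form. This is an illustration of the construction, not the paper's pricing model.

```python
# Sketch: Fréchet-type mixture copula C = p*W + (1-p-q)*Pi + q*M, rho_S = q - p.
import numpy as np

def mixture_copula(u: np.ndarray, v: np.ndarray, p: float, q: float) -> np.ndarray:
    W = np.maximum(u + v - 1.0, 0.0)   # countermonotonic (lower Fréchet bound)
    Pi = u * v                         # independence
    M = np.minimum(u, v)               # comonotonic (upper Fréchet bound)
    return p * W + (1.0 - p - q) * Pi + q * M

def spearman_rho(p: float, q: float) -> float:
    return q - p                       # rho_S(W) = -1, rho_S(Pi) = 0, rho_S(M) = 1

u, v = np.array([0.3, 0.7]), np.array([0.5, 0.9])
print("C(u, v):", mixture_copula(u, v, p=0.1, q=0.4))
print("Spearman's rho:", spearman_rho(p=0.1, q=0.4))
```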
9.
The finance literature provides ample evidence that diversification benefits hinge on the dependence between asset returns. A notable feature of the recent financial crisis is the extent to which assets that had hitherto moved mostly independently suddenly moved together, resulting in joint losses in most advanced markets. This provides grounds to uncover the relative potential of African markets to provide diversification benefits by means of their correlation with advanced markets. Therefore, we examine the dependence structure between advanced and emerging African stock markets using copulas. Several findings are documented. First, dependence is time-varying and weak for most African markets, except South Africa. Second, we find evidence of asymmetric dependence, suggesting that stock return comovement varies in bearish and bullish markets. Third, extreme downward stock price movements in the advanced markets do not have significant spillover effects on Africa's emerging stock markets. Our results, which imply that African markets, with the exception of South Africa, are immune to risk spillover from advanced markets, improve on the extant literature and have implications for portfolio diversification and risk management.
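A crude empirical check of asymmetric dependence (a sketch on simulated data, not the paper's copula estimator): transform two return series to copula scale with ranks and compare joint-exceedance probabilities in the lower and upper tails.

```python
# Sketch: empirical lower- vs upper-tail joint exceedance on rank-transformed data.
import numpy as np
from scipy.stats import rankdata

def tail_exceedance(x: np.ndarray, y: np.ndarray, q: float = 0.05):
    n = len(x)
    u, v = rankdata(x) / (n + 1.0), rankdata(y) / (n + 1.0)   # pseudo-observations
    lower = np.mean((u < q) & (v < q)) / q          # approx P(V < q | U < q)
    upper = np.mean((u > 1 - q) & (v > 1 - q)) / q  # approx P(V > 1-q | U > 1-q)
    return lower, upper

rng = np.random.default_rng(4)
f = rng.standard_normal(2000)                  # hypothetical common factor
x = 0.6 * f + 0.8 * rng.standard_normal(2000)  # stand-ins for two markets' returns
y = 0.6 * f + 0.8 * rng.standard_normal(2000)
print(tail_exceedance(x, y))
```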
10.
We review a rich class of point process models, Cox point processes, and illustrate the necessity of more than one observation (point patterns) in performing parameter estimation. Furthermore, we introduce a new Cox point process model by treating the intensity function of the underlying Poisson point process as a random mixture of normal components. The behaviour and performance of the new model are compared with those of popular Cox point process models. The new model is exemplified with an application that involves a single point pattern corresponding to earthquake events in California, USA.
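A minimal sketch of the construction (not the paper's model or its estimation): draw a random intensity as a mixture of Gaussian bumps on the unit square, then simulate the resulting Cox point pattern by thinning a homogeneous Poisson process. All parameters are illustrative.

```python
# Sketch: Cox process on [0,1]^2 with a random normal-mixture intensity,
# simulated by thinning a homogeneous Poisson process.
import numpy as np

rng = np.random.default_rng(5)
K, base, bw = 3, 400.0, 0.1
centres = rng.random((K, 2))                  # random component locations
weights = rng.dirichlet(np.ones(K))           # random mixture weights

def intensity(pts: np.ndarray) -> np.ndarray:
    d2 = ((pts[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
    return base * (weights * np.exp(-d2 / (2.0 * bw**2))).sum(axis=1)

lam_max = base                                # intensity is bounded above by base
n_cand = rng.poisson(lam_max)                 # homogeneous candidates on [0,1]^2
cand = rng.random((n_cand, 2))
keep = rng.random(n_cand) < intensity(cand) / lam_max
pattern = cand[keep]
print("points in the realised pattern:", len(pattern))
```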